BETO: Spanish BERT
Transformer-based models are making a tremendous impact in NLP, having proven effective across a wide range of tasks such as POS tagging, machine translation, named-entity recognition, and many text classification tasks. This year saw the introduction of a whole family of transformer-based language models such as BERT, Transformer-XL, and GPT-2, among others. Language models offer desirable properties that can be leveraged in a transfer learning setting: a model is first trained on large-scale data to learn the properties of language in an unsupervised fashion, and the resulting model and weights can then be fine-tuned in low-resource regimes to address different NLP tasks. In particular, it's exciting to see BERT applied in domains such as text classification, text summarization, text generation, and information retrieval.
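The pretrain-then-fine-tune recipe described above can be made concrete with a toy sketch: (1) learn word representations from unlabeled text, then (2) reuse them for a small labeled task. Real models like BERT learn contextual representations with a transformer; this sketch substitutes simple co-occurrence counts and a nearest-centroid classifier purely to illustrate the two-stage idea, and all data and helper names here are illustrative.

```python
# Toy illustration of transfer learning: "pretrain" crude word vectors on
# unlabeled text, then reuse them for a tiny labeled classification task.
# (Illustrative only -- BERT learns contextual vectors with a transformer.)
from collections import Counter, defaultdict

# Stage 1: unsupervised "pretraining" on unlabeled sentences.
unlabeled = [
    "the movie was great and the acting was great",
    "the film was terrible and the plot was terrible",
    "great acting makes a great movie",
    "a terrible plot makes a terrible film",
]

vocab = sorted({w for s in unlabeled for w in s.split()})

def cooccurrence_vectors(sentences, window=2):
    """Represent each word by counts of its neighbours (a crude embedding)."""
    counts = defaultdict(Counter)
    for s in sentences:
        words = s.split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    counts[w][words[j]] += 1
    return {w: [c[v] for v in vocab] for w, c in counts.items()}

embeddings = cooccurrence_vectors(unlabeled)

def sentence_vector(sentence):
    """Average the pretrained word vectors of a sentence."""
    vecs = [embeddings[w] for w in sentence.split() if w in embeddings]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

# Stage 2: reuse the pretrained representations on a few labeled examples
# (the low-resource "fine-tuning" step), via nearest-centroid classification.
labeled = [("great movie", "pos"), ("terrible film", "neg")]
centroids = {label: sentence_vector(text) for text, label in labeled}

def classify(sentence):
    v = sentence_vector(sentence)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(v, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

print(classify("the acting was great"))   # -> pos
print(classify("the plot was terrible"))  # -> neg
```

The point of the sketch is that stage 2 needs only two labeled examples because stage 1 already captured which words behave alike; this is the same division of labor that makes fine-tuning BERT effective in low-resource settings.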